A language model combining trigrams and stochastic context-free grammars
Authors
Abstract
We propose a class trigram language model in which each class is specified by a stochastic context-free grammar. We show how to estimate the parameters of the model, and how to smooth these estimates. We present experimental perplexity and speech recognition results.
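A minimal sketch of the idea described in the abstract, under toy assumptions: each class label in the trigram covers a span of words, a single-word class emits its word with probability 1, and one class (here a hypothetical DATE class) generates its span through a small PCFG whose inside probability serves as the emission probability. The class names, grammars, and probabilities below are illustrative assumptions, not the paper's trained models.

```python
from collections import defaultdict

def inside_prob(words, rules, start):
    """Inside probability that nonterminal `start` derives `words` under a
    PCFG in Chomsky normal form. `rules` maps a nonterminal to a list of
    (rhs, prob) pairs, where rhs is (terminal,) or (NT1, NT2)."""
    n = len(words)
    chart = defaultdict(float)          # (i, j, NT) -> P(NT => words[i:j])
    for i, w in enumerate(words):       # lexical rules fill width-1 cells
        for nt, expansions in rules.items():
            for rhs, p in expansions:
                if rhs == (w,):
                    chart[(i, i + 1, nt)] += p
    for span in range(2, n + 1):        # CKY-style combination of subspans
        for i in range(n - span + 1):
            j = i + span
            for nt, expansions in rules.items():
                for rhs, p in expansions:
                    if len(rhs) == 2:
                        left, right = rhs
                        for k in range(i + 1, j):
                            chart[(i, j, nt)] += (
                                p * chart[(i, k, left)] * chart[(k, j, right)]
                            )
    return chart[(0, n, start)]

# Toy PCFG defining one multi-word class, DATE (an assumption of this sketch).
grammars = {
    "DATE": {
        "DATE":  [(("MONTH", "DAY"), 1.0)],
        "MONTH": [(("may",), 0.5), (("june",), 0.5)],
        "DAY":   [(("first",), 0.5), (("second",), 0.5)],
    }
}

# Toy class-trigram probabilities P(c | c-2, c-1) with boundary markers.
class_trigram = {
    ("<s>", "<s>", "MEET"): 1.0,
    ("<s>", "MEET", "ON"): 1.0,
    ("MEET", "ON", "DATE"): 1.0,
    ("ON", "DATE", "</s>"): 1.0,
}

def sentence_prob(spans):
    """spans: list of (class, words). Chains P(class | two previous classes)
    with each class's emission probability; single-word classes (MEET, ON)
    emit their word with probability 1 in this sketch."""
    labels = ["<s>", "<s>"] + [c for c, _ in spans] + ["</s>"]
    p = 1.0
    for h1, h2, c in zip(labels, labels[1:], labels[2:]):
        p *= class_trigram.get((h1, h2, c), 0.0)
    for c, words in spans:
        if c in grammars:
            p *= inside_prob(words, grammars[c], c)
    return p

print(sentence_prob([("MEET", ["meet"]),
                     ("ON", ["on"]),
                     ("DATE", ["june", "first"])]))  # -> 0.25
```

The inside probability sums over all derivations of the span, so an ambiguous class grammar still yields a single well-defined emission probability for the trigram to combine with.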
Similar Papers
A Language Model Combining Trigrams and Context-Free Grammars
We present a class trigram language model in which each class is specified by a probabilistic context-free grammar. We show how to estimate the parameters of the model, and how to smooth these estimates. Experimental perplexity and speech recognition results are presented.
Integration of two stochastic context-free grammars
Some problems in speech and natural language processing involve combining two information sources, each modeled by a stochastic context-free grammar. Such cases include parsing the output of a speech recognizer with a context-free language model, finding the best solution among all possible ones in language generation, and preserving ambiguity in machine translation. In these cases usually at...
Stochastic Categorial Grammars
Statistical methods have turned out to be quite successful in natural language processing. In recent years, several models of stochastic grammars have been proposed, including models based on lexicalised context-free grammars [3], tree adjoining grammars [15], and dependency grammars [2, 5]. In this exploratory paper, we propose a new model of stochastic grammar, whose originality derive...
Combining Grammars for Improved Learning
We report experimental work on improving learning methods for probabilistic context-free grammars (PCFGs). From stacked regression we borrow the basic idea of combining grammars. Smoothing, a domain-independent method for combining grammars, does not offer noticeable performance gains. However, PCFGs allow much tighter, domain-dependent coupling, and we show that this may be exploited for sign...
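The abstract above names smoothing as a domain-independent way of combining grammars; a generic form of this is linear interpolation of the two grammars' rule probabilities. A minimal sketch under assumed conventions (the grammar encoding, the toy grammars, and the interpolation weight are this example's assumptions, not the paper's setup):

```python
from collections import defaultdict

def interpolate_pcfgs(g1, g2, lam):
    """Combine two PCFGs by linear interpolation:
    p(rhs | nt) = lam * p1(rhs | nt) + (1 - lam) * p2(rhs | nt).
    Each grammar maps a nonterminal to a list of (rhs, prob) pairs;
    a rule absent from one grammar contributes probability 0 there."""
    combined = {}
    for nt in set(g1) | set(g2):
        probs = defaultdict(float)
        for rhs, p in g1.get(nt, []):
            probs[rhs] += lam * p
        for rhs, p in g2.get(nt, []):
            probs[rhs] += (1.0 - lam) * p
        combined[nt] = sorted(probs.items())
    return combined

# Two toy grammars sharing the start symbol S.
g_a = {"S": [(("a",), 1.0)]}
g_b = {"S": [(("a",), 0.5), (("b",), 0.5)]}
mixed = interpolate_pcfgs(g_a, g_b, 0.6)
```

Because each input grammar's rule probabilities sum to 1 per nonterminal, the interpolated grammar's do as well, so the result is again a valid PCFG.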
Dialog-context dependent language modeling combining n-grams and stochastic context-free grammars
In this paper, we present our research on dialog-dependent language modeling. In accordance with a speech (or sentence) production model in a discourse, we split language modeling into two components, namely dialog-dependent concept modeling and syntactic modeling. The concept model is conditioned on the last question prompted by the dialog system and is structured using n-grams. The syntact...